Permutation entropy (PE) is an effective metric for quantifying the complexity of time series, with the merits of conceptual simplicity and high computational efficiency. To address the limitations of PE, weighted-permutation entropy (WPE) and reverse permutation entropy (RPE) were proposed. WPE introduces amplitude information to weight each ordinal pattern; it not only better reveals the complexity of time series with abrupt amplitude changes, but is also more robust to noise. RPE introduces distance information and is defined as the distance to white noise; it exhibits the opposite trend to traditional PE and is more stable for time series of different lengths. In this paper, we propose a novel complexity metric incorporating both distance and amplitude information, named reverse weighted-permutation entropy (RWPE), which combines the advantages of WPE and RPE. Four simulation experiments were conducted: an entropy-curve test with two probabilities, a mutation-signal complexity test, a noise-robustness test, and a complexity test on time series of various lengths. The simulation results show that RWPE is a valid complexity metric that accurately detects abrupt amplitude changes in time series and is more robust to noise. Moreover, it is more stable than the other three PE variants for time series of various lengths.
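The ordinal-pattern machinery underlying PE and WPE can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation; the default embedding dimension m = 3, delay 1, and variance-based pattern weights for WPE are assumptions:

```python
from collections import Counter
import math

def permutation_entropy(x, m=3, delay=1):
    """Bandt-Pompe permutation entropy, normalized to [0, 1]."""
    counts = Counter()
    for i in range(len(x) - (m - 1) * delay):
        window = tuple(x[i + j * delay] for j in range(m))
        # ordinal pattern: argsort of the window values
        counts[tuple(sorted(range(m), key=window.__getitem__))] += 1
    total = sum(counts.values())
    probs = (c / total for c in counts.values())
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))

def weighted_permutation_entropy(x, m=3, delay=1):
    """WPE sketch: each ordinal pattern weighted by its window's variance."""
    weights = Counter()
    for i in range(len(x) - (m - 1) * delay):
        window = [x[i + j * delay] for j in range(m)]
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        mean = sum(window) / m
        weights[pattern] += sum((v - mean) ** 2 for v in window) / m
    total = sum(weights.values())
    probs = (w / total for w in weights.values() if w > 0)
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))
```

A strictly monotone series contains a single ordinal pattern, so its normalized PE is zero, while a noisy series approaches one; the variance weighting makes WPE emphasize patterns carried by large-amplitude windows.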
Systemic risks must be vigilantly guarded against at all times to prevent their contagion across stock markets. New policies may also fail to work as intended and instead induce shocks to markets, especially emerging ones. Timely detection of systemic risks and policy-induced shocks is therefore crucial to safeguarding the health of stock markets. In this paper, we show that the relative entropy, or Kullback–Leibler divergence, can be used to identify systemic risks and policy-induced shocks in stock markets. Concretely, we analyzed minutely data of two stock indices, the Dow Jones Industrial Average (DJIA) and the Shanghai Stock Exchange (SSE) Composite Index, and examined the temporal variation of their relative entropy. We show that clustered peaks in the relative entropy curves accurately identify the timing of the 2007–2008 global financial crisis and its precursors, as well as the 2015 stock crashes in China. Moreover, the sharpest needle-like peak in the relative entropy curve, especially for the SSE market, consistently preceded an unusual market state, a strong bull market or a bubble, and thus has a certain forewarning ability.
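As an illustrative sketch of the quantity involved, and not the authors' pipeline, the Kullback–Leibler divergence between two empirical return distributions can be computed as follows; the function names and the fixed-bin histogram are assumptions:

```python
import math
from collections import Counter

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for two discrete distributions over the same bins."""
    return sum(pi * math.log(pi / max(qi, eps))
               for pi, qi in zip(p, q) if pi > 0)

def histogram(values, edges):
    """Empirical probability distribution of values over half-open bins."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for b in range(len(edges) - 1):
            if edges[b] <= v < edges[b + 1]:
                counts[b] += 1
                break
    total = sum(counts) or 1
    return [c / total for c in counts]
```

In a rolling analysis, p would be the return distribution in a recent window and q that of a reference period; peaks in the divergence then flag distributional shifts of the kind the paper associates with crises and policy shocks.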
It has been conjectured that the origin of the fundamental molecules of life, their proliferation over the surface of Earth, and their complexation through time are examples of photochemical dissipative structuring, dissipative proliferation, and dissipative selection, respectively, arising out of the non-equilibrium conditions created on Earth’s surface by the solar photon spectrum. Here I describe the non-equilibrium thermodynamics and the photochemical mechanisms involved in the synthesis and evolution of the fundamental molecules of life from simpler, more common precursor molecules under the long-wavelength UVC and UVB solar photons prevailing at Earth’s surface during the Archean. Dissipative structuring through photochemical mechanisms leads to carbon-based UVC pigments with peaked conical intersections which endow them with a large photon dissipative capacity (broad-wavelength absorption and rapid radiationless de-excitation). Dissipative proliferation occurs when the photochemical dissipative structuring becomes autocatalytic. Dissipative selection arises when fluctuations lead the system to new stationary states (corresponding to different molecular concentration profiles) of greater dissipative capacity, as predicted by the universal evolution criterion of Classical Irreversible Thermodynamic theory established by Onsager, Glansdorff, and Prigogine. An example is given of the UV photochemical dissipative structuring, proliferation, and selection of the nucleobase adenine from an aqueous solution of HCN under UVC light.
The aim of this work was to analyze, in the entropy–complexity (HxC) plane, time series obtained from ECG recordings, with the objective of discriminating between two groups of patients: normal sinus rhythm and cardiac arrhythmias.
The HxC plane used in this study has Shannon entropy on one axis and statistical complexity on the other. To compute the entropy, the probability distribution function (PDF) of the observed data was obtained using the methodology proposed by Bandt & Pompe (2002).
The database used in the present study consisted of ECG recordings obtained from PhysioNet: 47 long-term signals of patients with diagnosed cardiac arrhythmias and 18 long-term signals of patients in normal sinus rhythm were processed. Average values of statistical complexity and normalized Shannon entropy were calculated and analyzed in the HxC plane for each time series.
The average complexity values of the ECGs of patients with diagnosed arrhythmias were higher than those of the normal sinus rhythm group. Conversely, the average Shannon entropy values for arrhythmia patients were lower than those of the normal sinus rhythm group. This characteristic made it possible to discriminate the positions of the two signal groups in the HxC plane. The results were analyzed through a multivariate statistical hypothesis test.
The proposed methodology has remarkable conceptual simplicity and shows promising efficiency in the detection of cardiovascular pathologies.
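For concreteness, the two HxC coordinates, normalized Shannon entropy and Jensen–Shannon statistical complexity, can be sketched as below, applied to any discrete PDF such as a Bandt–Pompe ordinal-pattern distribution. This is an illustrative sketch, not the authors' code; the maximum disequilibrium used for normalization is computed numerically rather than from the closed form:

```python
import math

def shannon(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence between two distributions."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return shannon(m) - (shannon(p) + shannon(q)) / 2

def statistical_complexity(p):
    """C = H_norm * Q_J, with Q_J the JSD to uniform, normalized to [0, 1]."""
    n = len(p)
    uniform = [1 / n] * n
    h = shannon(p) / math.log(n)        # normalized entropy (the H axis)
    delta = [1.0] + [0.0] * (n - 1)     # most concentrated distribution
    q0 = jsd(delta, uniform)            # maximum disequilibrium, for scaling
    return h * jsd(p, uniform) / q0     # the C axis
```

Both the uniform distribution and a fully concentrated one have zero complexity; intermediate, structured distributions sit higher in the plane, which is what separates the two patient groups.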
The notion that the brain has a resting-state mode of functioning has received increasing attention in recent years. The idea derives from experimental observations of a relatively uniform, spatially and temporally, high level of neuronal activity when no explicit task is being performed. Surprisingly, the total energy consumption supporting neuronal firing in this conscious awake baseline state is orders of magnitude larger than the energy changes during stimulation. This paper presents a novel and counter-intuitive explanation of the high energy consumption of the brain at rest, obtained using the recently developed intelligence and embodiment hypothesis. This hypothesis is based on evolutionary neuroscience and postulates the existence of a common information-processing principle associated with naturally evolved nervous systems, one that serves as the foundation from which intelligence can emerge and that contributes to the efficiency of the brain's computations. The high energy consumption of the brain at rest is shown to be related to the most probable state of an equilibrium statistical mechanics model aimed at capturing the behavior of a system constrained by power consumption and evolutionarily designed to minimize metabolic demands.
In this talk, we propose an information-theoretic approach to designing functional representations that extract the hidden common structure shared by a set of random variables. The main idea is to measure the common information between the random variables by Watanabe's total correlation, and then find hidden attributes of these random variables such that, given these attributes, the common information is reduced the most. We show that these hidden attributes can be characterized by an exponential family specified by the eigendecomposition of a pairwise joint distribution matrix. Then, we adopt the log-likelihood functions for estimating these hidden attributes as the desired functional representations of the random variables, and show that these representations are informative for describing the common structure. Moreover, we design both a multivariate alternating conditional expectation (MACE) algorithm to compute the proposed functional representations for discrete data, and a novel neural network training scheme for continuous or high-dimensional data. Finally, the performance of our algorithms is validated by numerical simulations on the MNIST digit recognition task.
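Watanabe's total correlation, the quantity the talk builds on, is the sum of the marginal entropies minus the joint entropy; it can be estimated from discrete samples as in the minimal sketch below (function names are illustrative, not from the talk):

```python
import math
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy of an empirical sample (natural log)."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def total_correlation(columns):
    """Watanabe's total correlation: sum_i H(X_i) - H(X_1, ..., X_n)."""
    joint = list(zip(*columns))              # one tuple per joint observation
    return sum(entropy(col) for col in columns) - entropy(joint)
```

Independent columns give zero total correlation, while identical columns give the marginal entropy itself; good hidden attributes are those whose conditioning removes most of this quantity.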
A key element of thermodynamics is small entropy fluctuations away from local maxima at equilibrium. These fluctuating entropy decreases become proportionally larger as the volume decreases within some thermodynamic system. Lessened entropy indicates the formation of organized fluctuating mesoscopic structures. These structures depend upon the microscopic interactions present among the atomic constituents of the system. Entropy fluctuations may be represented by a thermodynamic information metric yielding directly a thermodynamic Ricci curvature scalar R. R is a thermodynamic invariant that is a measure of mesoscopic structure formation within the system. In my talk, I discuss the calculation and the physical interpretation of R in several scenarios: fluid systems, including supercooled liquid water, simple solids, spin systems, quantum fluid models, the quark-meson plasma, and black hole thermodynamics. This range of applications offers a strong argument for the effectiveness of R within thermodynamics.
The development of systematic coarse-grained mesoscopic models for complex molecular systems is an intense research area. Here we first give an overview of different methods for obtaining optimally parametrized coarse-grained models, starting from a detailed atomistic representation of high-dimensional molecular systems. We focus on methods based on information theory, such as relative entropy, showing that they provide parameterizations of coarse-grained models at equilibrium by minimizing a fitting functional over a parameter space. We also connect them with structure-based (inverse Boltzmann) and force-matching methods. All the mentioned methods are, in principle, employed to approximate a many-body potential, the (n-body) potential of mean force, which describes the equilibrium distribution of coarse-grained sites observed in simulations of atomistically detailed models. We also present, in a mathematically consistent way, the relative entropy and force-matching methods and their equivalence, which we derive for general nonlinear coarse-graining maps. We apply and compare the above-described methodologies on several molecular systems: a simple fluid (methane), water, and a polymer (polyethylene) bulk system. Finally, for the latter we also provide reliable confidence intervals using a statistical resampling technique, the bootstrap method.