Article ID: 523982
Journal: Journal of Informetrics
Published Year: 2014
Pages: 11 Pages
File Type: PDF
Abstract

•An information source composed of n random variables may be split into 2^n events.
•Once the maximum entropy is computed, the unused capacity and the efficiency of the source follow.
•The transmission power is defined as the efficiency of the transmission between the source's variables.
•The transmission power indicates how strong the synergy or control exerted within the source is.

In this paper, we show that an information source composed of n random variables may be split into 2^n or 2^n − 1 “states”; therefore, one can compute the maximum entropy of the source. We derive the efficiency and the unused capacity of an information source. We demonstrate that in more than two dimensions, the transmission's variability depends on the system configuration; thus, we determine the upper and lower bounds of the mutual information and propose the transmission power as an indicator of the Triple Helix of university–industry–government relationships. The transmission power is defined as the fraction of the total ‘configurational information’ produced in a system; it appears as the efficiency of the transmission and may be interpreted as the strength of the variables' dependency, the strength of the synergy between the system's variables, or the strength of the information flow within the system.
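The quantities named in the abstract can be illustrated with a minimal Python sketch. This is not the authors' implementation; the toy joint distribution and all variable names are illustrative assumptions. For n binary variables there are 2^n states, the maximum entropy is log2(2^n) = n bits, the efficiency is the ratio of the observed entropy to this maximum, and the unused capacity is the difference.

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability states."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution over n = 2 binary variables: 2^n = 4 states.
# (Hypothetical numbers chosen so the two variables are correlated.)
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
n = 2

H = entropy(joint.values())       # observed entropy of the source
H_max = math.log2(2 ** n)         # maximum entropy: n bits for 2^n equiprobable states
unused = H_max - H                # unused capacity
efficiency = H / H_max            # efficiency of the source

# Marginal entropies and the two-dimensional mutual information
# T(x:y) = H(x) + H(y) - H(x, y), which is positive here because
# the variables are correlated.
Hx = entropy([joint[(0, 0)] + joint[(0, 1)], joint[(1, 0)] + joint[(1, 1)]])
Hy = entropy([joint[(0, 0)] + joint[(1, 0)], joint[(0, 1)] + joint[(1, 1)]])
T_xy = Hx + Hy - H
```

With this distribution, H_max is exactly 2 bits, the efficiency is below 1 (the source does not use its full capacity), and T_xy is positive, reflecting the dependency between the two variables. The higher-dimensional “configurational information” discussed in the paper extends this by alternating signs over the marginal entropies of all variable subsets.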

Related Topics
Physical Sciences and Engineering; Computer Science; Computer Science Applications
Authors