Article ID: 407872 · Journal: Neurocomputing · Published Year: 2014 · Pages: 7 · File Type: PDF
Abstract

To uncover the hidden structure of a data set, it is important to understand the relationships between variables such as genes or neurons. Causality, as a measure of such relationships, identifies directed relations between variables, which can reveal more of the structure than undirected relations. Transfer entropy has been proposed as a quantitative measure of such causal relationships and has been successfully applied to capture the amount of information flow between events and sequences. To analyze this flow locally in time, we propose to localize normalized transfer entropy and to regularize it so as to avoid unstable results. Experimental results on synthetic and real-world data confirm the usefulness of our algorithm.
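To make the notion of transfer entropy concrete, the following is a minimal sketch of a plain plug-in estimator for discrete sequences with history length 1. It is an illustrative baseline only, not the localized, regularized variant proposed in the paper; the function name and the example setup are assumptions for demonstration.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) in bits, with history length 1:
    how much knowing x_t reduces uncertainty about y_{t+1}
    beyond what y_t alone already tells us."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))            # (y_{t+1}, y_t, x_t)
    n = len(triples)
    p_xyz = Counter(triples)                              # counts of (y_{t+1}, y_t, x_t)
    p_yz = Counter((yt1, yt) for yt1, yt, _ in triples)   # counts of (y_{t+1}, y_t)
    p_zx = Counter((yt, xt) for _, yt, xt in triples)     # counts of (y_t, x_t)
    p_z = Counter(yt for _, yt, _ in triples)             # counts of y_t
    te = 0.0
    for (yt1, yt, xt), c in p_xyz.items():
        # p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t), from empirical counts
        ratio = (c / p_zx[(yt, xt)]) / (p_yz[(yt1, yt)] / p_z[yt])
        te += (c / n) * log2(ratio)
    return te

# Usage: y copies x with a one-step lag, so information flows x -> y.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                  # y_{t+1} = x_t
print(transfer_entropy(x, y))     # close to 1 bit: x drives y
print(transfer_entropy(y, x))     # close to 0 bits: no flow back
```

Because this global estimate averages over the whole sequence, it cannot show *when* information flows; that limitation is what motivates localizing the measure in time, where small-sample instability in turn calls for the regularization the paper proposes.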

Related Topics: Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors