Transfer Entropy as a Log-likelihood Ratio

Research output: Contribution to journal › Article › peer-review

76 Citations (Scopus)
23 Downloads (Pure)


Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
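For the finite Markov chain case mentioned above, the idea can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it is a plug-in (maximum-likelihood) transfer entropy estimator for binary series with history length 1, a simplifying assumption. The scaled statistic 2N·TE is then the log-likelihood ratio statistic, which under the null of zero transfer is asymptotically χ²-distributed (here with 2 degrees of freedom, from the parameter count of the conditional-independence test).

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy (in nats) from y to x, history length 1.

    Estimates TE = sum p(x_{t+1}, x_t, y_t) *
                   log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]
    from empirical counts over the integer-coded series x and y.
    """
    x, y = np.asarray(x), np.asarray(y)
    n = len(x) - 1                               # number of transitions
    abc = Counter(zip(x[1:], x[:-1], y[:-1]))    # counts of (x_{t+1}, x_t, y_t)
    bc = Counter(zip(x[:-1], y[:-1]))            # counts of (x_t, y_t)
    ab = Counter(zip(x[1:], x[:-1]))             # counts of (x_{t+1}, x_t)
    b = Counter(x[:-1])                          # counts of x_t
    te = 0.0
    for (a_, b_, c_), n_abc in abc.items():
        # the 1/n normalizations cancel inside the log
        te += (n_abc / n) * np.log(n_abc * b[b_] / (bc[(b_, c_)] * ab[(a_, b_)]))
    return te

# Toy example: x copies y's previous value with probability 0.8,
# so there is genuine y -> x information transfer.
rng = np.random.default_rng(0)
T = 5000
y = rng.integers(0, 2, T)
x = np.empty(T, dtype=int)
x[0] = 0
for t in range(T - 1):
    x[t + 1] = y[t] if rng.random() < 0.8 else 1 - y[t]

te = transfer_entropy(x, y)
lr = 2 * (T - 1) * te   # log-likelihood ratio statistic, ~ chi^2(2) under the null
```

For the coupled pair above, `te` is close to the theoretical value ln 2 − H(0.8) ≈ 0.19 nats and `lr` vastly exceeds the 5% χ²(2) critical value of about 5.99, while replacing `x` with an independent series drives the estimate toward zero, consistent with the consistency and asymptotic results stated in the abstract.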
Original language: English
Pages (from-to): 1-10
Number of pages: 10
Journal: Physical Review Letters
Issue number: 13
Publication status: Published - Sep 2012
