1 Introduction    1
1.2.5 Random Boolean Networks    7
1.3 Information Flow and Causality    9
2 Statistical Preliminaries    11
2.2 Discrete Probabilities    13
2.3 Conditional, Independent and Joint Probabilities    14
2.3.1 Conditional Probabilities    14
2.3.2 Independent Probabilities    14
2.3.3 Joint Probabilities    15
2.3.4 Conditional Independence    16
2.3.5 Time-Series Data and Embedding Dimensions    17
2.3.6 Conditional Independence and Markov Processes    18
2.3.7 Vector Autoregression    20
2.4 Statistical Expectations, Moments and Correlations    20
2.5 Probability Distributions    22
2.5.1 Binomial Distribution    22
2.5.2 Poisson Distribution    23
2.5.3 Continuous Probabilities    24
2.5.4 Gaussian Distribution    25
2.5.5 Multivariate Gaussian Distribution    25
2.6 Symmetry and Symmetry Breaking    28
3 Information Theory    33
3.2.1 Entropy and Information    35
3.2.2 Mutual Information    38
3.2.3 Conditional Mutual Information    42
3.2.4 Kullback--Leibler Divergence    43
3.2.5 Entropy of Continuous Processes    45
3.2.6 Entropy and Kolmogorov Complexity    50
3.2.7 Historical Note: Mutual Information and Communication    50
3.3 Mutual Information and Phase Transitions    51
3.4.1 Calculating Entropy    53
3.4.2 Calculating Mutual Information    59
3.4.3 The Non-stationary Case    63
4 Transfer Entropy    65
4.2 Definition of Transfer Entropy    66
4.2.1 Determination of History Lengths    69
4.2.2 Computational Interpretation as Information Transfer    72
4.2.3 Conditional Transfer Entropy    74
4.2.5 Local Transfer Entropy    77
4.3 Transfer Entropy Estimators    78
4.3.1 KSG Estimation for Transfer Entropy    79
4.3.2 Symbolic Transfer Entropy    80
4.3.3 Open-Source Transfer Entropy Software    81
4.4 Relationship with Wiener--Granger Causality    82
4.4.1 Granger Causality Captures Causality as Predictive of Effect    83
4.4.2 Definition of Granger Causality    83
4.4.3 Maximum-Likelihood Estimation of Granger Causality    86
4.4.4 Granger Causality Versus Transfer Entropy    88
4.5 Comparing Transfer Entropy Values    90
4.5.1 Statistical Significance    90
4.5.2 Normalising Transfer Entropy    91
4.6 Information Transfer Density and Phase Transitions    92
4.7 Continuous-Time Processes    93
5 Information Transfer in Canonical Systems    97
5.3 Random Boolean Networks    106
5.6 Synchronisation Processes    119
6 Information Transfer in Financial Markets    125
6.1 Introduction to Financial Markets    126
6.2 Information Theory Applied to Financial Markets    128
6.2.1 Entropy and Economic Diversity: an Early Ecology of Economics    128
6.2.2 Maximum Entropy: Maximum Diversity?    129
6.2.3 Mutual Information: Phase Transitions and Market Crashes    129
6.3 Information Transferred from One Market Index to Another    130
6.4 From Indices to Equities and from Equities to Indices    133
6.4.1 Economics of Beauty Pageants    134
6.5 The Internal Economy and Its Place in the Global Economy    135
7 Miscellaneous Applications of Transfer Entropy    139
7.1 Information Transfer in Physiological Data    139
7.2 Effective Network Inference    143
7.2.1 Standard Pairwise TE Approach for Effective Network Inference    144
7.2.2 Addressing Redundancy and Synergy in the Data    145
7.2.3 Applications of Effective Network Inference    148
7.3 Applications in Neuroscience    149
7.3.1 TE for Pulse Sequences    149
7.3.2 Direct TE Estimation Between Spiking Neurons    151
7.3.3 TE in Brain Imaging    152
7.4 Information Transfer in Biochemical Networks    153
7.5 Information Transfer in Embodied Cognitive Systems    157
7.6 Information Transfer in Social Media    162
8 Concluding Remarks    167
8.1.1 Non-parametric Estimation    167
8.1.2 Parametric Estimation    168
8.1.3 Non-stationary Systems    169
8.2 Systems with Many Variables    169
8.3 Touching the Void: the Link to Thermodynamics    170
References    171
Index    187