
Weak Dependence: With Examples and Applications 2007 ed. [Paperback]

  • Format: Paperback / softback, XIV + 322 pages, height x width: 235x155 mm, weight: 516 g
  • Series: Lecture Notes in Statistics 190
  • Publication date: 18-Jul-2007
  • Publisher: Springer-Verlag New York Inc.
  • ISBN-10: 0387699511
  • ISBN-13: 9780387699516
  • Paperback
  • Price: 113,55 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 133,59 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping
  • Order lead time 2-4 weeks
This book develops Doukhan and Louhichi's (1999) idea of measuring the asymptotic independence of a random process. The authors, who helped develop this theory, propose examples of models fitting such conditions: stable Markov chains, dynamical systems, and more complicated nonlinear, non-Markovian, heteroskedastic models with infinite memory. Applications are still needed to develop a method of analysis for nonlinear time series, and this book provides a strong basis for additional studies.

This monograph is aimed at developing Doukhan and Louhichi's (1999) idea of measuring the asymptotic independence of a random process. The authors propose various examples of models fitting such conditions: stable Markov chains, dynamical systems, and more complicated nonlinear, non-Markovian, heteroskedastic models with infinite memory. Most of the commonly used stationary models fit their conditions, and the simplicity of the conditions is also their strength. The main existing tools for an asymptotic theory are developed under weak dependence, and the theory is applied to nonparametric statistics, spectral analysis, econometrics, and resampling. The level of generality makes those techniques quite robust with respect to the model; the limit theorems are sometimes sharp and always simple to apply. The theory is developed with proofs, and the authors propose to fix the notation for future applications. A large number of research papers deal with these ideas, and the authors, as well as numerous other investigators, participated actively in the development of this theory. Several applications are still needed to develop a method of analysis for (nonlinear) time series, and this book provides a strong basis for such studies.
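To make the phrase "measuring asymptotic independence" concrete, the following is a minimal sketch of the kind of covariance bound behind the Doukhan-Louhichi weak dependence coefficients; the generic factor ψ and the rate ε(r) are assumed notation for illustration and are not quoted from the book:

\[
\bigl|\operatorname{Cov}\bigl(f(X_{i_1},\dots,X_{i_u}),\ g(X_{j_1},\dots,X_{j_v})\bigr)\bigr|
\;\le\; \psi(f,g,u,v)\,\varepsilon(r),
\qquad \varepsilon(r)\xrightarrow[r\to\infty]{}0,
\]

for all index tuples with \(i_1\le\dots\le i_u<i_u+r\le j_1\le\dots\le j_v\) and suitable bounded Lipschitz test functions \(f,g\). Different choices of \(\psi\) (combinations of \(u\), \(v\) and the Lipschitz moduli of \(f\) and \(g\)) give rise to the η-, κ-, λ-, θ- and τ-type coefficients listed in Chapter 2 of the table of contents below.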

Reviews

From the reviews:

"I appreciate this book as a very welcome and thorough discussion of the actual state-of-the art in the modeling of dependence structures. It provides a large number of motivating examples and applications, rigorous proofs, and valuable intuitions for the willing and mathematically well-trained reader with essential prior knowledge of the mathematical prerequisites of weak dependence . It is the book to those researchers already aware of the necessity of the methods discussed here." (Harry Haupt, Advances in Statistical Analysis, Vol. 93, 2009)

"This book provides a detailed description of the notion of weak dependence as well as properties and applications. Overall the book is neatly written . the book is very rich in its material as it contains earlier works on dependence and show a lot of applications of the theory. It also contains a large number of examples and expositions of the idea of weak dependence in models which provide good insight." (Dimitris Karlis, Zentralblatt MATH, Vol. 1165, 2009)

Preface v
List of notations xiii
1 Introduction 1
1.1 From independence to dependence 1
1.2 Mixing 4
1.3 Mixingales and near epoch dependence 5
1.4 Association 6
1.5 Nonmixing models 8
2 Weak dependence 9
2.1 Function spaces 10
2.2 Weak dependence 11
2.2.1 η, κ, λ and ζ-coefficients 12
2.2.2 θ and τ-coefficients 14
2.2.3 α, β and φ-coefficients 16
2.2.4 Projective measure of dependence 19
3 Models 21
3.1 Bernoulli shifts 21
3.1.1 Volterra processes 22
3.1.2 Noncausal shifts with independent inputs 24
3.1.3 Noncausal shifts with dependent inputs 25
3.1.4 Causal shifts with independent inputs 31
3.1.5 Causal shifts with dependent inputs 32
3.2 Markov sequences 33
3.2.1 Contracting Markov chain 35
3.2.2 Nonlinear AR(d) models 36
3.2.3 ARCH-type processes 36
3.2.4 Branching type models 37
3.3 Dynamical systems 38
3.4 Vector valued LARCH(∞) processes 42
3.4.1 Chaotic expansion of LARCH(∞) models 43
3.4.2 Bilinear models 48
3.5 ζ-dependent models 53
3.5.1 Associated processes 53
3.5.2 Gaussian processes 55
3.5.3 Interacting particle systems 56
3.6 Other models 59
3.6.1 Random AR models 59
3.6.2 Integer valued models 61
3.6.3 Random fields 62
3.6.4 Continuous time 65
4 Tools for non causal cases 67
4.1 Indicators of weakly dependent processes 67
4.2 Low order moments inequalities 69
4.2.1 Variances 69
4.2.2 A (2 + δ)-order moment bound 70
4.3 Combinatorial moment inequalities 73
4.3.1 Marcinkiewicz-Zygmund type inequalities 77
4.3.2 Rosenthal type inequalities 79
4.3.3 A first exponential inequality 82
4.4 Cumulants 84
4.4.1 General properties of cumulants 84
4.4.2 A second exponential inequality 93
4.4.3 From weak dependence to the exponential bound 96
4.5 Tightness criteria 98
5 Tools for causal cases 103
5.1 Comparison results 103
5.2 Covariance inequalities 110
5.2.1 A covariance inequality for γ1 110
5.2.2 A covariance inequality for β and φ 111
5.3 Coupling 114
5.3.1 A coupling result for real valued random variables 115
5.3.2 Coupling in higher dimension 116
5.4 Exponential and moment inequalities 119
5.4.1 Bennett-type inequality 120
5.4.2 Burkholder's inequalities 123
5.4.3 Rosenthal inequalities using Rio techniques 125
5.4.4 Rosenthal inequalities for τ1-dependent sequences 130
5.4.5 Rosenthal inequalities under projective conditions 131
5.5 Maximal inequalities 132
6 Applications of SLLN 135
6.1 Stochastic algorithms with non causal dependent input 135
6.1.1 Weakly dependent noise 137
6.1.2 γ1-dependent noise 140
6.2 Examples of application 142
6.2.1 Robbins-Monro algorithm 142
6.2.2 Kiefer-Wolfowitz algorithm 143
6.3 Weighted dependent triangular arrays 143
6.4 Linear regression 145
7 Central limit theorem 153
7.1 Non causal case: stationary sequences 153
7.2 Lindeberg method 155
7.2.1 Proof of the main results 158
7.2.2 Rates of convergence 161
7.3 Non causal random fields 163
7.4 Conditional central limit theorem (causal) 173
7.4.1 Definitions and preliminary lemmas 174
7.4.2 Invariance of the conditional variance 176
7.4.3 End of the proof 178
7.5 Applications 182
7.5.1 Stable convergence 182
7.5.2 Sufficient conditions for stationary sequences 184
7.5.3 γ-dependent sequences 189
7.5.4 α and φ-dependent sequences 192
7.5.5 Sufficient conditions for triangular arrays 194
8 Donsker principles 199
8.1 Non causal stationary sequences 199
8.2 Non causal random fields 200
8.2.1 Moment inequality 201
8.2.2 Finite dimensional convergence 202
8.2.3 Tightness 205
8.3 Conditional (causal) invariance principle 205
8.3.1 Preliminaries 206
8.3.2 Finite dimensional convergence 207
8.3.3 Relative compactness 208
8.4 Applications 209
8.4.1 Sufficient conditions for stationary sequences 209
8.4.2 Sufficient conditions for triangular arrays 212
9 Law of the iterated logarithm (LIL) 213
9.1 Bounded LIL under a non causal condition 213
9.2 Causal strong invariance principle 214
10 The empirical process 223
10.1 A simple condition for the tightness 224
10.2 η-dependent sequences 225
10.3 α, β and φ-dependent sequences 231
10.4 θ and τ-dependent sequences 233
10.5 Empirical copula processes 234
10.6 Random fields 236
11 Functional estimation 247
11.1 Some non-parametric problems 247
11.2 Kernel regression estimates 248
11.2.1 Second order and CLT results 249
11.2.2 Almost sure convergence properties 252
11.3 MISE for β-dependent sequences 254
11.4 General kernels 260
12 Spectral estimation 265
12.1 Spectral densities 265
12.2 Periodogram 269
12.2.1 Whittle estimation 274
12.3 Spectral density estimation 275
12.3.1 Second order estimate 277
12.3.2 Dependence coefficients 279
13 Econometric applications and resampling 283
13.1 Econometrics 283
13.1.1 Unit root tests 284
13.1.2 Parametric problems 285
13.1.3 A semi-parametric estimation problem 285
13.2 Bootstrap 287
13.2.1 Block bootstrap 288
13.2.2 Bootstrapping GMM estimators 288
13.2.3 Conditional bootstrap 290
13.2.4 Sieve bootstrap 290
13.3 Limit variance estimates 292
13.3.1 Moments, cumulants and weak dependence 293
13.3.2 Estimation of the limit variance 295
13.3.3 Law of large numbers 297
13.3.4 Central limit theorem 299
13.3.5 A non centered variant 302
Bibliography 305
Index 317