
Introduction to Time Series Modeling: with Applications in R, 2nd Edition [Hardback]

Genshiro Kitagawa (Institute of Statistical Mathematics, Tokyo, Japan)

Praise for the first edition:

[This book] reflects the extensive experience and significant contributions of the author to non-linear and non-Gaussian modeling. … [It] is a valuable book, especially with its broad and accessible introduction of models in the state space framework.

Statistics in Medicine

What distinguishes this book from comparable introductory texts is the use of state space modeling. Along with this come a number of valuable tools for recursive filtering and smoothing, including the Kalman filter, as well as non-Gaussian and sequential Monte Carlo filters.

MAA Reviews

Introduction to Time Series Modeling: with Applications in R, Second Edition covers numerous stationary and nonstationary time series models and tools for estimating and utilizing them. The goal of this book is to enable readers to build their own models to understand, predict and master time series. The second edition makes it possible for readers to reproduce the examples in this book by using the freely available R package TSSS to perform computations for their own real-world time series problems.
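As a rough illustration of the kind of reproducible workflow described above (a sketch only: it uses base R's StructTS and the built-in AirPassengers series, not the book's TSSS functions, whose interface is not quoted here):

```r
# Illustrative sketch, not code from the book or from TSSS:
# decompose a monthly series into trend, slope and seasonal components
# with a basic structural (state-space) model from base R.
y <- log(AirPassengers)            # built-in monthly airline passenger counts

fit  <- StructTS(y, type = "BSM")  # Gaussian state-space fit via the Kalman filter
comp <- tsSmooth(fit)              # smoothed state estimates (level, slope, seasonal)

plot(comp, main = "Smoothed state components (illustrative)")
```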

This book employs the state-space model as a generic tool for time series modeling and presents the Kalman filter, the non-Gaussian filter, and the particle filter as convenient tools for recursive estimation for state-space models. Further, it takes a unified approach based on the entropy maximization principle and employs various methods of parameter estimation and model selection, including the least squares method, the maximum likelihood method, recursive estimation for state-space models, and model selection by AIC.
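A minimal sketch of likelihood-based fitting followed by model selection with AIC, in the same spirit (again generic base R rather than the book's TSSS code; arima() evaluates the Gaussian likelihood through a Kalman filter applied to the state-space form of the model):

```r
# Illustrative only: maximum likelihood fits of several candidate ARMA
# models, followed by model selection with AIC.
y <- diff(log(UKgas))                        # any stationary series will do

orders <- list(c(1, 0, 0), c(2, 0, 0), c(1, 0, 1), c(2, 0, 1))
fits   <- lapply(orders, function(ord) arima(y, order = ord))

aic <- sapply(fits, AIC)
names(aic) <- sapply(orders, paste, collapse = ",")
print(aic)                                   # the smallest AIC indicates the preferred model
best <- fits[[which.min(aic)]]
```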

Along with the standard stationary time series models, such as the AR and ARMA models, the book also introduces nonstationary time series models such as the locally stationary AR model, the trend model, the seasonal adjustment model, the time-varying coefficient AR model and nonlinear non-Gaussian state-space models.
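For the standard AR model mentioned above, base R already automates order selection by AIC; a minimal sketch under these assumptions (base R stand-ins, not the book's TSSS functions) is:

```r
# Illustrative sketch: fit an AR model with the order chosen by AIC and
# inspect the implied (parametric) power spectrum.
fit <- ar(lh, order.max = 10, aic = TRUE)   # Yule-Walker fit, AIC-selected order
fit$order                                   # selected AR order
spec.ar(fit, main = "AR power spectrum (illustrative)")
```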

 

About the Author

Genshiro Kitagawa is a project professor at the University of Tokyo, the former Director-General of the Institute of Statistical Mathematics, and the former President of the Research Organization of Information and Systems.

Preface xi
Preface for Second Edition xiii
R and the Time Series Modeling Package TSSS xv
1 Introduction and Preparatory Analysis 1(18)
1.1 Time Series Data 1(5)
1.2 Classification of Time Series 6(3)
1.3 Objectives of Time Series Analysis 9(1)
1.4 Pre-Processing of Time Series 9(8)
1.4.1 Transformation of variables 10(1)
1.4.2 Differencing 11(1)
1.4.3 Month-to-month basis and year-over-year 12(2)
1.4.4 Moving average 14(3)
1.5 Organization of This Book 17(2)
2 The Covariance Function 19(16)
2.1 The Distribution of Time Series and Stationarity 19(3)
2.2 The Autocovariance Function of Stationary Time Series 22(1)
2.3 Estimation of the Autocovariance and Autocorrelation Functions 23(3)
2.4 Multivariate Time Series and Scatterplots 26(3)
2.5 Cross-Covariance Function and Cross-Correlation Function 29(6)
3 The Power Spectrum and the Periodogram 35(20)
3.1 The Power Spectrum 35(5)
3.2 The Periodogram 40(4)
3.3 Averaging and Smoothing of the Periodogram 44(4)
3.4 Computational Method of Periodogram 48(1)
3.5 Computation of the Periodogram by Fast Fourier Transform 49(6)
4 Statistical Modeling 55(24)
4.1 Probability Distributions and Statistical Models 55(5)
4.2 K-L Information and Entropy Maximization Principle 60(4)
4.3 Estimation of the K-L Information and the Log-Likelihood 64(1)
4.4 Estimation of Parameters by the Maximum Likelihood Method 65(4)
4.5 AIC (Akaike Information Criterion) 69(4)
4.5.1 Evaluation of C1 71(1)
4.5.2 Evaluation of C3 72(1)
4.5.3 Evaluation of C2 72(1)
4.5.4 Evaluation of C and AIC 73(1)
4.6 Transformation of Data 73(6)
5 The Least Squares Method 79(12)
5.1 Regression Models and the Least Squares Method 79(2)
5.2 The Least Squares Method Based on the Householder Transformation 81(2)
5.3 Selection of Order by AIC 83(4)
5.4 Addition of Data and Successive Householder Reduction 87(1)
5.5 Variable Selection by AIC 88(3)
6 Analysis of Time Series Using ARMA Models 91(22)
6.1 ARMA Model 91(1)
6.2 The Impulse Response Function 92(2)
6.3 The Autocovariance Function 94(2)
6.4 The Relation Between AR Coefficients and PARCOR 96(1)
6.5 The Power Spectrum of the ARMA Process 96(4)
6.6 The Characteristic Equation 100(4)
6.7 The Multivariate AR Model 104(9)
7 Estimation of an AR Model 113(24)
7.1 Fitting an AR Model 113(2)
7.2 Yule-Walker Method and Levinson's Algorithm 115(1)
7.3 Estimation of an AR Model by the Least Squares Method 116(2)
7.4 Estimation of an AR Model by the PARCOR Method 118(3)
7.5 Large Sample Distribution of the Estimates 121(3)
7.6 Estimation of Multivariate AR Model by the Yule-Walker Method 124(5)
7.7 Estimation of Multivariate AR Model by the Least Squares Method 129(8)
8 The Locally Stationary AR Model 137(16)
8.1 Locally Stationary AR Model 137(2)
8.2 Automatic Partitioning of the Time Interval into an Arbitrary Number of Subintervals 139(5)
8.3 Precise Estimation of the Change Point 144(5)
8.4 Posterior Probability of the Change Point 149(4)
9 Analysis of Time Series with a State-Space Model 153(18)
9.1 The State-Space Model 153(3)
9.2 State Estimation via the Kalman Filter 156(2)
9.3 Smoothing Algorithms 158(1)
9.4 Long-Term Prediction of the State 158(1)
9.5 Prediction of Time Series 159(4)
9.6 Likelihood Computation and Parameter Estimation for Time Series Models 163(3)
9.7 Interpolation of Missing Observations 166(5)
10 Estimation of the ARMA Model 171(10)
10.1 State-Space Representation of the ARMA Model 171(1)
10.2 Initial State Distribution for an AR Model 172(2)
10.3 Initial State Distribution of an ARMA Model 174(1)
10.4 Maximum Likelihood Estimates of an ARMA Model 175(3)
10.5 Initial Estimates of Parameters 178(3)
11 Estimation of Trends 181(14)
11.1 The Polynomial Trend Model 181(3)
11.2 Trend Component Model - Model for Gradual Changes 184(4)
11.3 Trend Model 188(7)
12 The Seasonal Adjustment Model 195(18)
12.1 Seasonal Component Model 195(3)
12.2 Standard Seasonal Adjustment Model 198(3)
12.3 Decomposition Including a Stationary AR Component 201(5)
12.4 Decomposition Including a Trading-Day Effect 206(7)
13 Time-Varying Coefficient AR Model 213(16)
13.1 Time-Varying Variance Model 213(4)
13.2 Time-Varying Coefficient AR Model 217(5)
13.3 Estimation of the Time-Varying Spectrum 222(2)
13.4 The Assumption on System Noise for the Time-Varying Coefficient AR Model 224(1)
13.5 Abrupt Changes of Coefficients 225(4)
14 Non-Gaussian State-Space Model 229(20)
14.1 Necessity of Non-Gaussian Models 229(1)
14.2 Non-Gaussian State-Space Models and State Estimation 230(2)
14.3 Numerical Computation of the State Estimation Formula 232(3)
14.4 Non-Gaussian Trend Model 235(5)
14.5 Non-symmetric Distribution - A Time-Varying Variance Model 240(4)
14.6 Applications of the Non-Gaussian State-Space Model 244(5)
14.6.1 Processing of the outliers by a mixture of Gaussian distributions 245(1)
14.6.2 A nonstationary discrete process 245(1)
14.6.3 A direct method of estimating the time-varying variance 246(1)
14.6.4 Nonlinear state-space models 247(2)
15 Particle Filter 249(18)
15.1 The Nonlinear Non-Gaussian State-Space Model and Approximations of Distributions 249(4)
15.2 Particle Filter 253(6)
15.2.1 One-step-ahead prediction 253(1)
15.2.2 Filtering 253(1)
15.2.3 Algorithm for the particle filter 254(1)
15.2.4 Likelihood of a model 254(1)
15.2.5 On the re-sampling method 255(1)
15.2.6 Numerical examples 256(3)
15.3 Particle Smoothing Method 259(3)
15.4 Nonlinear Smoothing 262(5)
16 Simulation 267(44)
16.1 Generation of Uniform Random Numbers 267(2)
16.2 Generation of White Noise 269(4)
16.2.1 χ² distribution 272(1)
16.2.2 Cauchy distribution 272(1)
16.2.3 Arbitrary distribution 272(1)
16.3 Simulation of ARMA models 273(2)
16.4 Simulation Using a State-Space Model 275(4)
16.5 Simulation with the Non-Gaussian State-Space Model 279(4)
A Algorithms for Nonlinear Optimization 283(2)
B Derivation of Levinson's Algorithm 285(4)
C Derivation of the Kalman Filter and Smoother Algorithms 289(4)
C.1 Kalman Filter 289(1)
C.2 Smoothing 290(3)
D Algorithm for the Particle Filter 293(18)
D.1 One-Step-Ahead Prediction 293(1)
D.2 Filter 294(1)
D.3 Smoothing 295(16)
Bibliography 311(8)
Index 319