Preface  xvii
Acknowledgments  xix
About this book  xx
About the author  xxiv
About the cover illustration  xxv
|
Part 1  Time waits for no one  1
|
1  Understanding time series forecasting  3
  1.1  Introducing time series  4
    Components of a time series  5
  1.2  Bird's-eye view of time series forecasting  8
    Determining what must be forecast to achieve your goal  9
    Setting the horizon of the forecast  10
    Developing a forecasting model  10
  1.3  How time series forecasting is different from other regression tasks  12
    Time series have an order  12
    Time series sometimes do not have features  13
|
2  A naive prediction of the future  14
  2.1  Defining a baseline model  16
  2.2  Forecasting the historical mean  17
    Setup for baseline implementations  17
    Implementing the historical mean baseline  19
  2.3  Forecasting last year's mean  23
  2.4  Predicting using the last known value  25
  2.5  Implementing the naive seasonal forecast  26
|
|
3  Going on a random walk  30
  3.1  The random walk process  31
    Simulating a random walk process  32
  3.2  Identifying a random walk  35
    The autocorrelation function  41
  3.3  Forecasting a random walk  47
    Forecasting on a long horizon  48
    Forecasting the next timestep  52
  Exercises  56
    Simulate and forecast a random walk  56
    Forecast the daily closing price of GOOGL  57
    Forecast the daily closing price of a stock of your choice  57
|
Part 2  Forecasting with statistical models  59
|
4  Modeling a moving average process  61
  4.1  Defining a moving average process  63
    Identifying the order of a moving average process  64
  4.2  Forecasting a moving average process  69
  Exercises  79
    Simulate an MA(2) process and make forecasts  79
    Simulate an MA(q) process and make forecasts  80
|
5  Modeling an autoregressive process  81
  5.1  Predicting the average weekly foot traffic in a retail store  82
  5.2  Defining the autoregressive process  84
  5.3  Finding the order of a stationary autoregressive process  85
    The partial autocorrelation function (PACF)  89
  5.4  Forecasting an autoregressive process  92
  Exercises  99
    Simulate an AR(2) process and make forecasts  99
    Simulate an AR(p) process and make forecasts  100
|
6  Modeling complex time series  101
  6.1  Forecasting bandwidth usage for data centers  102
  6.2  Examining the autoregressive moving average process  105
  6.3  Identifying a stationary ARMA process  106
  6.4  Devising a general modeling procedure  111
    Understanding the Akaike information criterion (AIC)  113
    Selecting a model using the AIC  114
    Understanding residual analysis  116
    Performing residual analysis  121
  6.5  Applying the general modeling procedure  125
  6.6  Forecasting bandwidth usage  132
  Exercises  137
    Make predictions on the simulated ARMA(1, 1) process  137
    Simulate an ARMA(2, 2) process and make forecasts  137
|
7  Forecasting non-stationary time series  140
  7.1  Defining the autoregressive integrated moving average model  142
  7.2  Modifying the general modeling procedure to account for non-stationary series  143
  7.3  Forecasting a non-stationary time series  145
  Exercises  154
    Apply the ARIMA(p, d, q) model on the datasets from chapters 4, 5, and 6  154
|
8  Accounting for seasonality  156
  8.1  Examining the SARIMA(p, d, q)(P, D, Q)m model  157
  8.2  Identifying seasonal patterns in a time series  160
  8.3  Forecasting the number of monthly air passengers  163
    Forecasting with an ARIMA(p, d, q) model  165
    Forecasting with a SARIMA(p, d, q)(P, D, Q)m model  171
    Comparing the performance of each forecasting method  176
  Exercises  178
    Apply the SARIMA(p, d, q)(P, D, Q)m model on the Johnson & Johnson dataset  178
|
9  Adding external variables to our model  180
  9.1  Examining the SARIMAX model  182
    Exploring the exogenous variables of the US macroeconomics dataset  183
  9.2  Forecasting the real GDP using the SARIMAX model  186
  Exercises  195
    Use all exogenous variables in a SARIMAX model to predict the real GDP  195
|
10  Forecasting multiple time series  197
  10.1  Examining the VAR model  199
  10.2  Designing a modeling procedure for the VAR(p) model  201
    Exploring the Granger causality test  201
  10.3  Forecasting real disposable income and real consumption  203
  Exercises  214
    Use a VARMA model to predict realdpi and realcons  214
    Use a VARMAX model to predict realdpi and realcons  215
|
11  Capstone: Forecasting the number of antidiabetic drug prescriptions in Australia  216
  11.1  Importing the required libraries and loading the data  218
  11.2  Visualizing the series and its components  219
    Performing model selection  222
    Conducting residual analysis  224
  11.4  Forecasting and evaluating the model's performance  225
|
Part 3  Large-scale forecasting with deep learning  231
|
12  Introducing deep learning for time series forecasting  233
  12.1  When to use deep learning for time series forecasting  234
  12.2  Exploring the different types of deep learning models  234
  12.3  Getting ready to apply deep learning for forecasting  237
    Performing data exploration  237
    Feature engineering and data splitting  241
|
13  Data windowing and creating baselines for deep learning  248
  13.1  Creating windows of data  249
    Exploring how deep learning models are trained for time series forecasting  249
    Implementing the DataWindow class  253
  13.2  Applying baseline models  260
    Single-step baseline model  260
    Multi-step baseline models  263
    Multi-output baseline model  266
|
14  Baby steps with deep learning  270
  14.1  Implementing a linear model  271
    Implementing a single-step linear model  272
    Implementing a multi-step linear model  274
    Implementing a multi-output linear model  275
  14.2  Implementing a deep neural network  276
    Implementing a deep neural network as a single-step model  278
    Implementing a deep neural network as a multi-step model  281
    Implementing a deep neural network as a multi-output model  282
|
15  Remembering the past with LSTM  287
  15.1  Exploring the recurrent neural network (RNN)  288
  15.2  Examining the LSTM architecture  290
  15.3  Implementing the LSTM architecture  295
    Implementing an LSTM as a single-step model  295
    Implementing an LSTM as a multi-step model  297
    Implementing an LSTM as a multi-output model  299
|
16  Filtering a time series with CNN  305
  16.1  Examining the convolutional neural network (CNN)  306
    Implementing a CNN as a single-step model  310
    Implementing a CNN as a multi-step model  314
    Implementing a CNN as a multi-output model  315
|
17  Using predictions to make more predictions  320
  17.1  Examining the ARLSTM architecture  321
  17.2  Building an autoregressive LSTM model  322
|
18  Capstone: Forecasting the electric power consumption of a household  329
  18.1  Understanding the capstone project  330
    Objective of this capstone project  331
  18.2  Data wrangling and preprocessing  333
    Dealing with missing data  334
    Removing unnecessary columns  338
    Identifying the seasonal period  339
    Splitting and scaling the data  341
  18.4  Preparing for modeling with deep learning  342
    Defining the DataWindow class  343
    Utility function to train our models  346
  18.5  Modeling with deep learning  346
    Long short-term memory (LSTM) model  351
    Convolutional neural network (CNN)  351
    Combining a CNN with an LSTM  354
    The autoregressive LSTM model  355
|
Part 4  Automating forecasting at scale  359
|
19  Automating time series forecasting with Prophet  361
  19.1  Overview of the automated forecasting libraries  362
  19.3  Basic forecasting with Prophet  365
  19.4  Exploring Prophet's advanced functionality  370
    Visualization capabilities  370
    Cross-validation and performance metrics  374
  19.5  Implementing a robust forecasting process with Prophet  381
    Forecasting project: Predicting the popularity of "chocolate" searches on Google  381
    Experiment: Can SARIMA do better?  389
  Exercises  394
    Forecast the number of air passengers  394
    Forecast the volume of antidiabetic drug prescriptions  394
    Forecast the popularity of a keyword on Google Trends  394
|
20  Capstone: Forecasting the monthly average retail price of steak in Canada  396
  20.1  Understanding the capstone project  397
    Objective of the capstone project  397
  20.2  Data preprocessing and visualization  398
  20.3  Modeling with Prophet  400
  20.4  Optional: Develop a SARIMA model  404
|
21  Going above and beyond  410
  21.1  Summarizing what you've learned  411
    Statistical methods for forecasting  411
    Deep learning methods for forecasting  412
    Automating the forecasting process  413
  21.2  What if forecasting does not work?  413
  21.3  Other applications of time series data  415
Appendix  Installation instructions  418

Index  421