Preface  xiii
Acknowledgments  xv
About this book  xvii
About the author  xix
About the cover  xx
|
PART 1  YOUR MACHINE-LEARNING RIG  1
|
1  A machine-learning odyssey  3

1.1  Machine-learning fundamentals  5
     Parameters 7 · Learning and inference 8

1.2  Data representation and features  9

1.3  Distance metrics  15

1.4  Types of learning  17
     Supervised learning 17 · Unsupervised learning 19 · Reinforcement learning 19

1.5  TensorFlow  21

1.6  Overview of future chapters  22

Summary  24
|
2  TensorFlow essentials  25

2.1  Ensuring that TensorFlow works  27

2.2  Representing tensors  28

2.3  Creating operators  32

2.4  Executing operators with sessions  34
     Understanding code as a graph 35 · Setting session configurations 36

2.5  Writing code in Jupyter  38

2.6  Using variables  41

2.7  Saving and loading variables  43

2.8  Visualizing data using TensorBoard  44
     Implementing a moving average 44 · Visualizing the moving average 46

Summary  49
|
PART 2  CORE LEARNING ALGORITHMS  51
|
3  Linear regression and beyond  53

3.1  Formal notation  54
     How do you know the regression algorithm is working? 57

3.2  Linear regression  59

3.3  Polynomial model  62

3.4  Regularization  65

3.5  Application of linear regression  69

Summary  70
|
4  A gentle introduction to classification  71

4.1  Formal notation  73

4.2  Measuring performance  75
     Accuracy 75 · Precision and recall 76 · Receiver operating characteristic curve 77

4.3  Using linear regression for classification  78

4.4  Using logistic regression  83
     Solving one-dimensional logistic regression 84 · Solving two-dimensional logistic regression 87

4.5  Multiclass classifier  90
     One-versus-all 91 · One-versus-one 92 · Softmax regression 92

4.6  Application of classification  96

Summary  97
|
5  Automatically clustering data  99

5.1  Traversing files in TensorFlow  100

5.2  Extracting features from audio  102

5.3  K-means clustering  106

5.4  Audio segmentation  109

5.5  Clustering using a self-organizing map  112

5.6  Application of clustering  117

Summary  117
|
|
6  Hidden Markov models  119

6.1  Example of a not-so-interpretable model  121

6.2  Markov model  121

6.3  Hidden Markov model  124

6.4  Forward algorithm  125

6.5  Viterbi decoding  128

6.6  Uses of hidden Markov models  130
     Modeling a video 130 · Modeling DNA 130 · Modeling an image 130

6.7  Application of hidden Markov models  130

Summary  131
|
PART 3  THE NEURAL NETWORK PARADIGM  133
|
7  A peek into autoencoders  135

7.1  Neural networks  136

7.2  Autoencoders  140

7.3  Batch training  145

7.4  Working with images  146

7.5  Application of autoencoders  150

Summary  151
|
|
8  Reinforcement learning  153

8.1  Formal notions  155
     Policy 156 · Utility 157

8.2  Applying reinforcement learning  158

8.3  Implementing reinforcement learning  160

8.4  Exploring other applications of reinforcement learning  167

Summary  168
|
9  Convolutional neural networks  169

9.1  Drawback of neural networks  170

9.2  Convolutional neural networks  171

9.3  Preparing the image  173
     Generating filters 176 · Convolving using filters 178 · Max pooling 181

9.4  Implementing a convolutional neural network in TensorFlow  182
     Measuring performance 185 · Training the classifier 186

9.5  Tips and tricks to improve performance  187

9.6  Application of convolutional neural networks  188

Summary  188
|
10  Recurrent neural networks  189

10.1  Contextual information  190

10.2  Introduction to recurrent neural networks  190

10.3  Implementing a recurrent neural network  192

10.4  A predictive model for time-series data  195

10.5  Application of recurrent neural networks  198

Summary  199
|
11  Sequence-to-sequence models for chatbots  201

11.1  Building on classification and RNNs  202

11.2  Seq2seq architecture  205

11.3  Vector representation of symbols  210

11.4  Putting it all together  212

11.5  Gathering dialogue data  220

Summary  222
|
|
12  Utility landscape  223

12.1  Preference model  226

12.2  Image embedding  231

12.3  Ranking images  234

Summary  239

What's next  239
Appendix  Installation  241

Index  247