Machine Learning with TensorFlow [Paperback]

  • Format: Paperback / softback, 272 pages, height x width x thickness: 235x190x16 mm, weight: 454 g, Illustrations
  • Publication date: 10-Apr-2018
  • Publisher: Manning Publications
  • ISBN-10: 1617293873
  • ISBN-13: 9781617293870
DESCRIPTION

Being able to make near-real-time decisions is becoming increasingly crucial. To succeed, we need machine learning systems that can turn massive amounts of data into valuable insights. But when you're just starting out in the data science field, how do you begin creating machine learning applications? The answer is TensorFlow, a new open source machine learning library from Google. The TensorFlow library can take your high-level designs and turn them into the low-level mathematical operations required by machine learning algorithms.
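
As a taste of that idea, here is a minimal sketch using the session-based TensorFlow 1.x API that the book was written against; the snippet is illustrative and not taken from the book's own listings:

    import tensorflow as tf  # TensorFlow 1.x, the version the book targets

    # The high-level design: describe the computation as a graph of operations.
    a = tf.constant(2.0)
    b = tf.constant(3.0)
    total = tf.add(a, b)      # nothing is computed yet; only the graph is built

    # TensorFlow then carries out the underlying numerical work in a session.
    with tf.Session() as sess:
        print(sess.run(total))  # prints 5.0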

Machine Learning with TensorFlow teaches readers about machine learning algorithms and how to implement solutions with TensorFlow. It starts with an overview of machine learning concepts and moves on to the essentials needed to begin using TensorFlow. Each chapter zooms into a prominent example of machine learning. Readers can cover them all to master the basics or skip around to cater to their needs. By the end of this book, readers will be able to solve classification, clustering, regression, and prediction problems in the real world.

KEY FEATURES

  • Lots of diagrams, code examples, and exercises
  • Solves real-world problems with TensorFlow
  • Uses well-studied neural network architectures
  • Presents code that can be used for the reader's own applications

AUDIENCE

This book is for programmers who have some experience with Python and linear algebra concepts like vectors and matrices. No experience with machine learning is necessary.

ABOUT THE TECHNOLOGY

Google open-sourced its machine learning framework, TensorFlow, in late 2015 under the Apache 2.0 license. Before that, it was used internally at Google in speech recognition, Search, Photos, and Gmail, among other applications. TensorFlow is one of the most popular machine learning libraries.

TABLE OF CONTENTS

Preface
Acknowledgments
About this book
About the author
About the cover

PART 1 YOUR MACHINE-LEARNING RIG

1 A machine-learning odyssey
    1.1 Machine-learning fundamentals
        Parameters
        Learning and inference
    1.2 Data representation and features
    1.3 Distance metrics
    1.4 Types of learning
        Supervised learning
        Unsupervised learning
        Reinforcement learning
    1.5 TensorFlow
    1.6 Overview of future chapters
    1.7 Summary
2 TensorFlow essentials
    2.1 Ensuring that TensorFlow works
    2.2 Representing tensors
    2.3 Creating operators
    2.4 Executing operators with sessions
        Understanding code as a graph
        Setting session configurations
    2.5 Writing code in Jupyter
    2.6 Using variables
    2.7 Saving and loading variables
    2.8 Visualizing data using TensorBoard
        Implementing a moving average
        Visualizing the moving average
    2.9 Summary

PART 2 CORE LEARNING ALGORITHMS

3 Linear regression and beyond
    3.1 Formal notation
        How do you know the regression algorithm is working?
    3.2 Linear regression
    3.3 Polynomial model
    3.4 Regularization
    3.5 Application of linear regression
    3.6 Summary
4 A gentle introduction to classification
    4.1 Formal notation
    4.2 Measuring performance
        Accuracy
        Precision and recall
        Receiver operating characteristic curve
    4.3 Using linear regression for classification
    4.4 Using logistic regression
        Solving one-dimensional logistic regression
        Solving two-dimensional logistic regression
    4.5 Multiclass classifier
        One-versus-all
        One-versus-one
        Softmax regression
    4.6 Application of classification
    4.7 Summary
5 Automatically clustering data
    5.1 Traversing files in TensorFlow
    5.2 Extracting features from audio
    5.3 K-means clustering
    5.4 Audio segmentation
    5.5 Clustering using a self-organizing map
    5.6 Application of clustering
    5.7 Summary
6 Hidden Markov models
    6.1 Example of a not-so-interpretable model
    6.2 Markov model
    6.3 Hidden Markov model
    6.4 Forward algorithm
    6.5 Viterbi decoding
    6.6 Uses of hidden Markov models
        Modeling a video
        Modeling DNA
        Modeling an image
    6.7 Application of hidden Markov models
    6.8 Summary

PART 3 THE NEURAL NETWORK PARADIGM

7 A peek into autoencoders
    7.1 Neural networks
    7.2 Autoencoders
    7.3 Batch training
    7.4 Working with images
    7.5 Application of autoencoders
    7.6 Summary
8 Reinforcement learning
    8.1 Formal notions
        Policy
        Utility
    8.2 Applying reinforcement learning
    8.3 Implementing reinforcement learning
    8.4 Exploring other applications of reinforcement learning
    8.5 Summary
9 Convolutional neural networks
    9.1 Drawback of neural networks
    9.2 Convolutional neural networks
    9.3 Preparing the image
        Generating filters
        Convolving using filters
        Max pooling
    9.4 Implementing a convolutional neural network in TensorFlow
        Measuring performance
        Training the classifier
    9.5 Tips and tricks to improve performance
    9.6 Application of convolutional neural networks
    9.7 Summary
10 Recurrent neural networks
    10.1 Contextual information
    10.2 Introduction to recurrent neural networks
    10.3 Implementing a recurrent neural network
    10.4 A predictive model for time-series data
    10.5 Application of recurrent neural networks
    10.6 Summary
11 Sequence-to-sequence models for chatbots
    11.1 Building on classification and RNNs
    11.2 Seq2seq architecture
    11.3 Vector representation of symbols
    11.4 Putting it all together
    11.5 Gathering dialogue data
    11.6 Summary
12 Utility landscape
    12.1 Preference model
    12.2 Image embedding
    12.3 Ranking images
    12.4 Summary
    12.5 What's next?

Appendix: Installation
Index

AUTHOR BIO

Nishant Shukla is a computer vision researcher at UCLA, focusing on machine learning techniques with robotics. He has been a developer for Microsoft, Facebook, and Foursquare, and a machine learning engineer for SpaceX, as well as the author of the Haskell Data Analysis Cookbook.