Deep Learning on Graphs [Kõva köide]

Yao Ma (Michigan State University), Jiliang Tang (Michigan State University)
  • Format: Hardback, 400 pages, height x width x thickness: 234x155x23 mm, weight: 610 g, worked examples or exercises
  • Publication date: 23-Sep-2021
  • Publisher: Cambridge University Press
  • ISBN-10: 1108831745
  • ISBN-13: 9781108831741
This comprehensive text on the theory and techniques of graph neural networks takes students, practitioners, and researchers from the basics to the state of the art. It systematically introduces foundational topics such as filtering, pooling, robustness, and scalability, and then demonstrates applications in NLP, data mining, vision, and healthcare.

Deep learning on graphs has become one of the hottest topics in machine learning. The book consists of four parts to best accommodate our readers with diverse backgrounds and purposes of reading. Part 1 introduces basic concepts of graphs and deep learning; Part 2 discusses the most established methods from the basic to advanced settings; Part 3 presents the most typical applications including natural language processing, computer vision, data mining, biochemistry and healthcare; and Part 4 describes advances of methods and applications that tend to be important and promising for future research. The book is self-contained, making it accessible to a broader range of readers including (1) senior undergraduate and graduate students; (2) practitioners and project managers who want to adopt graph neural networks into their products and platforms; and (3) researchers without a computer science background who want to use graph neural networks to advance their disciplines.

Reviews

'This timely book covers a combination of two active research areas in AI: deep learning and graphs. It serves the pressing need for researchers, practitioners, and students to learn these concepts and algorithms, and apply them in solving real-world problems. Both authors are world-leading experts in this emerging area.' Huan Liu, Arizona State University

'Deep learning on graphs is an emerging and important area of research. This book by Yao Ma and Jiliang Tang covers not only the foundations, but also the frontiers and applications of graph deep learning. This is a must-read for anyone considering diving into this fascinating area.' Shuiwang Ji, Texas A&M University

'The first textbook of Deep Learning on Graphs, with systematic, comprehensive and up-to-date coverage of graph neural networks, autoencoder on graphs, and their applications in natural language processing, computer vision, data mining, biochemistry and healthcare. A valuable book for anyone to learn this hot theme!' Jiawei Han, University of Illinois at Urbana-Champaign

'This book systematically covers the foundations, methodologies, and applications of deep learning on graphs. Especially, it comprehensively introduces graph neural networks and their recent advances. This book is self-contained and nicely structured and thus suitable for readers with different purposes. I highly recommend those who want to conduct research in this area or deploy graph deep learning techniques in practice to read this book.' Charu Aggarwal, Distinguished Research Staff Member at IBM and recipient of the W. Wallace McDowell Award

Additional information

A comprehensive text on foundations and techniques of graph neural networks with applications in NLP, data mining, vision and healthcare.
Preface xiii
Acknowledgments xvii
1 Deep Learning on Graphs: An Introduction 1(14)
1.1 Introduction 1(1)
1.2 Why Deep Learning on Graphs? 1(2)
1.3 What Content Is Covered? 3(3)
1.4 Who Should Read This Book? 6(2)
1.5 Feature Learning on Graphs: A Brief History 8(5)
1.5.1 Feature Selection on Graphs 9(1)
1.5.2 Representation Learning on Graphs 10(3)
1.6 Conclusion 13(1)
1.7 Further Reading 13(2)
Part I Foundations 15(58)
2 Foundations of Graphs 17(26)
2.1 Introduction 17(1)
2.2 Graph Representations 18(1)
2.3 Properties and Measures 19(7)
2.3.1 Degree 19(2)
2.3.2 Connectivity 21(2)
2.3.3 Centrality 23(3)
2.4 Spectral Graph Theory 26(3)
2.4.1 Laplacian Matrix 26(2)
2.4.2 The Eigenvalues and Eigenvectors of the Laplacian Matrix 28(1)
2.5 Graph Signal Processing 29(4)
2.5.1 Graph Fourier Transform 30(3)
2.6 Complex Graphs 33(6)
2.6.1 Heterogeneous Graphs 33(1)
2.6.2 Bipartite Graphs 33(1)
2.6.3 Multidimensional Graphs 34(1)
2.6.4 Signed Graphs 35(1)
2.6.5 Hypergraphs 36(1)
2.6.6 Dynamic Graphs 37(2)
2.7 Computational Tasks on Graphs 39(3)
2.7.1 Node-Focused Tasks 39(2)
2.7.2 Graph-Focused Tasks 41(1)
2.8 Conclusion 42(1)
2.9 Further Reading 42(1)
3 Foundations of Deep Learning 43(30)
3.1 Introduction 43(1)
3.2 Deep Feedforward Networks 44(7)
3.2.1 The Architecture 46(1)
3.2.2 Activation Functions 47(3)
3.2.3 Output Layer and Loss Function 50(1)
3.3 Convolutional Neural Networks 51(8)
3.3.1 The Convolution Operation and Convolutional Layer 52(4)
3.3.2 Convolutional Layers in Practice 56(1)
3.3.3 Nonlinear Activation Layer 57(1)
3.3.4 Pooling Layer 58(1)
3.3.5 An Overall CNN Framework 58(1)
3.4 Recurrent Neural Networks 59(4)
3.4.1 The Architecture of Traditional RNNs 60(1)
3.4.2 Long Short-Term Memory 61(2)
3.4.3 Gated Recurrent Unit 63(1)
3.5 Autoencoders 63(4)
3.5.1 Undercomplete Autoencoders 65(1)
3.5.2 Regularized Autoencoders 66(1)
3.6 Training Deep Neural Networks 67(4)
3.6.1 Training with Gradient Descent 67(1)
3.6.2 Backpropagation 68(3)
3.6.3 Preventing Overfitting 71(1)
3.7 Conclusion 71(1)
3.8 Further Reading 72(1)
Part II Methods 73(132)
4 Graph Embedding 75(32)
4.1 Introduction 75(2)
4.2 Graph Embedding for Simple Graphs 77(17)
4.2.1 Preserving Node Co-occurrence 77(9)
4.2.2 Preserving Structural Role 86(3)
4.2.3 Preserving Node Status 89(2)
4.2.4 Preserving Community Structure 91(3)
4.3 Graph Embedding on Complex Graphs 94(11)
4.3.1 Heterogeneous Graph Embedding 94(2)
4.3.2 Bipartite Graph Embedding 96(1)
4.3.3 Multidimensional Graph Embedding 97(2)
4.3.4 Signed Graph Embedding 99(3)
4.3.5 Hypergraph Embedding 102(2)
4.3.6 Dynamic Graph Embedding 104(1)
4.4 Conclusion 105(1)
4.5 Further Reading 106(1)
5 Graph Neural Networks 107(31)
5.1 Introduction 107(2)
5.2 The General GNN Frameworks 109(3)
5.2.1 A General Framework for Node-Focused Tasks 109(1)
5.2.2 A General Framework for Graph-Focused Tasks 110(2)
5.3 Graph Filters 112(16)
5.3.1 Spectral-Based Graph Filters 112(10)
5.3.2 Spatial-Based Graph Filters 122(6)
5.4 Graph Pooling 128(7)
5.4.1 Flat Graph Pooling 129(1)
5.4.2 Hierarchical Graph Pooling 130(5)
5.5 Parameter Learning for Graph Neural Networks 135(1)
5.5.1 Parameter Learning for Node Classification 135(1)
5.5.2 Parameter Learning for Graph Classification 136(1)
5.6 Conclusion 136(1)
5.7 Further Reading 137(1)
6 Robust Graph Neural Networks 138(24)
6.1 Introduction 138(1)
6.2 Graph Adversarial Attacks 138(13)
6.2.1 Taxonomy of Graph Adversarial Attacks 139(2)
6.2.2 White-Box Attack 141(3)
6.2.3 Gray-Box Attack 144(4)
6.2.4 Black-Box Attack 148(3)
6.3 Graph Adversarial Defenses 151(9)
6.3.1 Graph Adversarial Training 152(2)
6.3.2 Graph Purification 154(1)
6.3.3 Graph Attention 155(4)
6.3.4 Graph Structure Learning 159(1)
6.4 Conclusion 160(1)
6.5 Further Reading 160(2)
7 Scalable Graph Neural Networks 162(14)
7.1 Introduction 162(4)
7.2 Node-wise Sampling Methods 166(2)
7.3 Layer-wise Sampling Methods 168(4)
7.4 Subgraph-wise Sampling Methods 172(2)
7.5 Conclusion 174(1)
7.6 Further Reading 175(1)
8 Graph Neural Networks for Complex Graphs 176(12)
8.1 Introduction 176(1)
8.2 Heterogeneous Graph Neural Networks 176(2)
8.3 Bipartite Graph Neural Networks 178(1)
8.4 Multidimensional Graph Neural Networks 179(2)
8.5 Signed Graph Neural Networks 181(3)
8.6 Hypergraph Neural Networks 184(1)
8.7 Dynamic Graph Neural Networks 185(2)
8.8 Conclusion 187(1)
8.9 Further Reading 187(1)
9 Beyond GNNs: More Deep Models on Graphs 188(17)
9.1 Introduction 188(1)
9.2 Autoencoders on Graphs 189(2)
9.3 Recurrent Neural Networks on Graphs 191(2)
9.4 Variational Autoencoders on Graphs 193(6)
9.4.1 Variational Autoencoders for Node Representation Learning 195(1)
9.4.2 Variational Autoencoders for Graph Generation 196(3)
9.5 Generative Adversarial Networks on Graphs 199(4)
9.5.1 Generative Adversarial Networks for Node Representation Learning 200(1)
9.5.2 Generative Adversarial Networks for Graph Generation 201(2)
9.6 Conclusion 203(1)
9.7 Further Reading 203(2)
Part III Applications 205(60)
10 Graph Neural Networks in Natural Language Processing 207(15)
10.1 Introduction 207(1)
10.2 Semantic Role Labeling 208(3)
10.3 Neural Machine Translation 211(1)
10.4 Relation Extraction 211(2)
10.5 Question Answering 213(3)
10.5.1 The Multihop QA Task 213(1)
10.5.2 Entity-GCN 214(2)
10.6 Graph to Sequence Learning 216(2)
10.7 Graph Neural Networks on Knowledge Graphs 218(3)
10.7.1 Graph Filters for Knowledge Graphs 218(1)
10.7.2 Transforming Knowledge Graphs to Simple Graphs 219(1)
10.7.3 Knowledge Graph Completion 220(1)
10.8 Conclusion 221(1)
10.9 Further Reading 221(1)
11 Graph Neural Networks in Computer Vision 222(14)
11.1 Introduction 222(1)
11.2 Visual Question Answering 222(5)
11.2.1 Images as Graphs 224(1)
11.2.2 Images and Questions as Graphs 225(2)
11.3 Skeleton-Based Action Recognition 227(2)
11.4 Image Classification 229(4)
11.4.1 Zero-Shot Image Classification 230(1)
11.4.2 Few-Shot Image Classification 231(1)
11.4.3 Multilabel Image Classification 232(1)
11.5 Point Cloud Learning 233(1)
11.6 Conclusion 234(1)
11.7 Further Reading 235(1)
12 Graph Neural Networks in Data Mining 236(16)
12.1 Introduction 236(1)
12.2 Web Data Mining 236(8)
12.2.1 Social Network Analysis 237(3)
12.2.2 Recommender Systems 240(4)
12.3 Urban Data Mining 244(3)
12.3.1 Traffic Prediction 244(2)
12.3.2 Air Quality Forecasting 246(1)
12.4 Cybersecurity Data Mining 247(3)
12.4.1 Malicious Account Detection 247(2)
12.4.2 Fake News Detection 249(1)
12.5 Conclusion 250(1)
12.6 Further Reading 251(1)
13 Graph Neural Networks in Biochemistry and Healthcare 252(13)
13.1 Introduction 252(1)
13.2 Drug Development and Discovery 252(6)
13.2.1 Molecule Representation Learning 253(1)
13.2.2 Protein Interface Prediction 254(2)
13.2.3 Drug-Target Binding Affinity Prediction 256(2)
13.3 Drug Similarity Integration 258(1)
13.4 Polypharmacy Side Effect Prediction 259(3)
13.5 Disease Prediction 262(2)
13.6 Conclusion 264(1)
13.7 Further Reading 264(1)
Part IV Advances 265(24)
14 Advanced Topics in Graph Neural Networks 267(14)
14.1 Introduction 267(1)
14.2 Deeper Graph Neural Networks 268(3)
14.2.1 Jumping Knowledge 270(1)
14.2.2 DropEdge 270(1)
14.2.3 PairNorm 270(1)
14.3 Exploring Unlabeled Data via Self-Supervised Learning 271(4)
14.3.1 Node-Focused Tasks 271(3)
14.3.2 Graph-Focused Tasks 274(1)
14.4 Expressiveness of Graph Neural Networks 275(4)
14.4.1 Weisfeiler-Lehman Test 276(2)
14.4.2 Expressiveness 278(1)
14.5 Conclusion 279(1)
14.6 Further Reading 279(2)
15 Advanced Applications in Graph Neural Networks 281(8)
15.1 Introduction 281(1)
15.2 Combinatorial Optimization on Graphs 281(2)
15.3 Learning Program Representations 283(2)
15.4 Reasoning Interacting Dynamical Systems in Physics 285(1)
15.5 Conclusion 286(1)
15.6 Further Reading 286(3)
Bibliography 289(26)
Index 315
Yao Ma is a PhD student in the Department of Computer Science and Engineering at Michigan State University (MSU). He is the recipient of the Outstanding Graduate Student Award and the FAST Fellowship at MSU. He has published papers in top conferences such as WSDM, ICDM, SDM, WWW, IJCAI, SIGIR and KDD, which have been cited hundreds of times. He is the leading organizer and presenter of tutorials on GNNs at AAAI'20, KDD'20 and AAAI'21, which received wide attention and acclaim. He has served as a Program Committee member and reviewer for many well-known conferences and journals, including AAAI, BigData, IJCAI, TWEB, TKDD and TPAMI.

Jiliang Tang is Assistant Professor in the Department of Computer Science and Engineering at Michigan State University. Previously, he was a research scientist at Yahoo Research. He received the 2020 SIGKDD Rising Star Award, the 2020 Distinguished Withrow Research Award, the 2019 NSF CAREER Award, the 2019 IJCAI Early Career Invited Talk, and seven best paper (and runner-up) awards. He has helped organize top data science conferences including KDD, WSDM and SDM, and is an associate editor of the TKDD journal. His research has been published in highly ranked journals and top conferences, has received more than 12,000 citations with an h-index of 55, and has attracted extensive media coverage.