
E-book: Introduction to Natural Language Processing

  • Format: PDF+DRM
  • Price: 156,00 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed

  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

A survey of computational methods for understanding, generating, and manipulating human language, which offers a synthesis of classical representations and algorithms with contemporary machine learning techniques.


This textbook provides a technical perspective on natural language processing—methods for building computer software that understands, generates, and manipulates human language. It emphasizes contemporary data-driven approaches, focusing on techniques from supervised and unsupervised machine learning. The first section establishes a foundation in machine learning by building a set of tools that will be used throughout the book and applying them to word-based textual analysis. The second section introduces structured representations of language, including sequences, trees, and graphs. The third section explores different approaches to the representation and analysis of linguistic meaning, ranging from formal logic to neural word embeddings. The final section offers chapter-length treatments of three transformative applications of natural language processing: information extraction, machine translation, and text generation. End-of-chapter exercises include both paper-and-pencil analysis and software implementation.

The text synthesizes and distills a broad and diverse research literature, linking contemporary machine learning techniques with the field's linguistic and computational foundations. It is suitable for use in advanced undergraduate and graduate-level courses and as a reference for software engineers and data scientists. Readers should have a background in computer programming and college-level mathematics. After mastering the material presented, students will have the technical skill to build and analyze novel natural language processing systems and to understand the latest research in the field.
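For a flavor of the book's data-driven starting point (the bag of words and Naive Bayes, covered in Part I), here is a minimal sketch in pure Python. The function name and toy data are illustrative, not taken from the book:

```python
from collections import Counter
import math

def train_nb(docs, labels):
    """Train a Naive Bayes text classifier with add-one smoothing.

    docs: list of token lists (bag-of-words documents)
    labels: parallel list of class labels
    Returns a predict function mapping a token list to a label.
    """
    vocab = {w for d in docs for w in d}
    counts = {c: Counter() for c in set(labels)}  # per-class word counts
    priors = Counter(labels)                      # per-class document counts
    for d, c in zip(docs, labels):
        counts[c].update(d)

    def predict(doc):
        best, best_lp = None, float("-inf")
        for c in counts:
            lp = math.log(priors[c] / len(docs))  # log prior
            total = sum(counts[c].values()) + len(vocab)
            for w in doc:
                # add-one (Laplace) smoothed log likelihood
                lp += math.log((counts[c][w] + 1) / total)
            if lp > best_lp:
                best, best_lp = c, lp
        return best

    return predict

# Hypothetical toy sentiment data
docs = [["good", "fun"], ["bad", "boring"], ["good", "great"], ["boring", "bad"]]
labels = ["pos", "neg", "pos", "neg"]
predict = train_nb(docs, labels)
print(predict(["good", "great", "fun"]))  # → pos
```

The book develops this kind of classifier rigorously, then replaces the generative model with discriminative and neural alternatives in later chapters.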

Table of Contents

Preface  ix
Notation  xiii
1 Introduction  1
1.1 Natural Language Processing and Its Neighbors  1
1.2 Three Themes in Natural Language Processing  5
I LEARNING  11
2 Linear Text Classification  13
2.1 The Bag of Words  13
2.2 Naive Bayes  17
2.3 Discriminative Learning  24
2.4 Loss Functions and Large-Margin Classification  28
2.5 Logistic Regression  34
2.6 Optimization  37
2.7 *Additional Topics in Classification  40
2.8 Summary of Learning Algorithms  42
3 Nonlinear Classification  47
3.1 Feedforward Neural Networks  48
3.2 Designing Neural Networks  50
3.3 Learning Neural Networks  53
3.4 Convolutional Neural Networks  61
4 Linguistic Applications of Classification  67
4.1 Sentiment and Opinion Analysis  67
4.2 Word Sense Disambiguation  71
4.3 Design Decisions for Text Classification  74
4.4 Evaluating Classifiers  78
4.5 Building Datasets  85
5 Learning without Supervision  91
5.1 Unsupervised Learning  91
5.2 Applications of Expectation-Maximization  99
5.3 Semi-Supervised Learning  102
5.4 Domain Adaptation  105
5.5 *Other Approaches to Learning with Latent Variables  109
II SEQUENCES AND TREES  117
6 Language Models  119
6.1 N-Gram Language Models  120
6.2 Smoothing and Discounting  122
6.3 Recurrent Neural Network Language Models  127
6.4 Evaluating Language Models  132
6.5 Out-of-Vocabulary Words  134
7 Sequence Labeling  137
7.1 Sequence Labeling as Classification  137
7.2 Sequence Labeling as Structure Prediction  139
7.3 The Viterbi Algorithm  140
7.4 Hidden Markov Models  145
7.5 Discriminative Sequence Labeling with Features  149
7.6 Neural Sequence Labeling  158
7.7 *Unsupervised Sequence Labeling  161
8 Applications of Sequence Labeling  167
8.1 Part-of-Speech Tagging  167
8.2 Morphosyntactic Attributes  173
8.3 Named Entity Recognition  175
8.4 Tokenization  176
8.5 Code Switching  177
8.6 Dialogue Acts  178
9 Formal Language Theory  183
9.1 Regular Languages  184
9.2 Context-Free Languages  198
9.3 *Mildly Context-Sensitive Languages  209
10 Context-Free Parsing  215
10.1 Deterministic Bottom-Up Parsing  216
10.2 Ambiguity  219
10.3 Weighted Context-Free Grammars  222
10.4 Learning Weighted Context-Free Grammars  227
10.5 Grammar Refinement  231
10.6 Beyond Context-Free Parsing  238
11 Dependency Parsing  243
11.1 Dependency Grammar  243
11.2 Graph-Based Dependency Parsing  248
11.3 Transition-Based Dependency Parsing  253
11.4 Applications  261
III MEANING  267
12 Logical Semantics  269
12.1 Meaning and Denotation  270
12.2 Logical Representations of Meaning  270
12.3 Semantic Parsing and the Lambda Calculus  274
12.4 Learning Semantic Parsers  280
13 Predicate-Argument Semantics  289
13.1 Semantic Roles  291
13.2 Semantic Role Labeling  295
13.3 Abstract Meaning Representation  302
14 Distributional and Distributed Semantics  309
14.1 The Distributional Hypothesis  309
14.2 Design Decisions for Word Representations  311
14.3 Latent Semantic Analysis  313
14.4 Brown Clusters  315
14.5 Neural Word Embeddings  317
14.6 Evaluating Word Embeddings  322
14.7 Distributed Representations beyond Distributional Statistics  324
14.8 Distributed Representations of Multiword Units  327
15 Reference Resolution  333
15.1 Forms of Referring Expressions  334
15.2 Algorithms for Coreference Resolution  339
15.3 Representations for Coreference Resolution  348
15.4 Evaluating Coreference Resolution  353
16 Discourse  357
16.1 Segments  357
16.2 Entities and Reference  359
16.3 Relations  362
IV APPLICATIONS  377
17 Information Extraction  379
17.1 Entities  381
17.2 Relations  387